FixedSizeArrays support (And Memory Support) #1669
base: main

Conversation
Codecov Report — coverage diff:

```
@@ Coverage Diff @@
##             main    #1669      +/-   ##
==========================================
+ Coverage   68.16%   69.82%   +1.66%
==========================================
  Files         109      111       +2
  Lines       11779    12157     +378
==========================================
+ Hits         8029     8489     +460
+ Misses       3750     3668      -82
```
With the latest commit, I'm outputting a `FixedSizeArray`. It turned out to be more intricate than I initially thought, as I had to add support for `Memory`. In addition, I stumbled on the fact that IFRT is broken for Arrays at least: #1672.

Tests are still to be done. If this is shippable I'll start writing tests for it (and for `Memory`).
Hmm, the builds are failing because they're running on v1.10 and `Memory` isn't defined there.
You can guard the `Memory`-specific code:

```julia
if isdefined(Base, :Memory)
    # ...
end
```

Relatedly, note that `Memory` was only introduced in Julia 1.11.
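As a runnable sketch of that guard (the `backing_length` helper is made up here, purely to illustrate the pattern):

```julia
# Compile the Memory-specific method only on Julia versions where
# Base.Memory exists (1.11+); older versions fall back to Vector.
# `backing_length` is a hypothetical helper, not part of Reactant.
@static if isdefined(Base, :Memory)
    backing_length(m::Memory) = length(m)
else
    backing_length(v::Vector) = length(v)
end
```

Using `@static` makes the check happen at macro-expansion time, so the unsupported branch is never compiled on older Julia.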
I've found a problem with this implementation: it only works in the 1D case, since the parent of a `FixedSizeArray` is a 1-dimensional `Memory`. In the latest commit, I reshaped the RArray to the shape of the underlying FixedSizeArray.
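The 1D issue can be seen with Base types alone: `Memory` is a flat buffer, so the N-dimensional shape has to be reintroduced with `reshape` (a minimal sketch, assuming Julia ≥ 1.11):

```julia
# A Memory buffer is always one-dimensional; the shape lives in the
# wrapper (Array, FixedSizeArray, ...), not in the buffer itself.
mem = Memory{Float64}(undef, 12)
mem .= 1:12

# Rebuilding the 2-D view requires an explicit reshape:
A = reshape(mem, 3, 4)
```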
Force-pushed from 8625b1a to 658a3d7 (compare).
Those parts have been refactored into their own functions.
probably should be …
I can also reproduce it with:

```julia
julia> using FixedSizeArrays, Reactant

julia> x = FixedSizeArray(randn(3, 4));

julia> rx = Reactant.to_rarray(x);
ERROR: MethodError: no method matching make_buffer_array(::Reactant.XLA.PJRT.Client, ::Memory{Float64}, ::Reactant.XLA.PJRT.Device)
The function `make_buffer_array` exists, but no method is defined for this combination of argument types.

Closest candidates are:
  make_buffer_array(::Reactant.XLA.PJRT.Client, ::Array{T, N}, ::Reactant.XLA.PJRT.Device) where {T, N}
   @ Reactant ~/.julia/packages/Reactant/eaYLR/src/xla/PJRT/Buffer.jl:9

Stacktrace:
```
```julia
@inline ConcreteRArray{T}(::UndefInitializer, shape::Integer...; kwargs...) where {T} = ConcreteRArray{
    T
}(
    undef, Dims(shape); kwargs...
)
```
Suggested change:

```diff
-@inline ConcreteRArray{T}(::UndefInitializer, shape::Integer...; kwargs...) where {T} = ConcreteRArray{
-    T
-}(
-    undef, Dims(shape); kwargs...
-)
+@inline ConcreteRArray{T}(::UndefInitializer, shape::Integer...; kwargs...) where {T} =
+    ConcreteRArray{T}(undef, Dims(shape); kwargs...)
```
```julia
julia> using FixedSizeArrays, Reactant

julia> M = randn(3, 4);

julia> M_fs = FixedSizeMatrix(M);

julia> rM = Reactant.to_rarray(M);

julia> rM_fs = Reactant.to_rarray(M_fs);

julia> fn(x, y) = sin.(x) .+ cos.(y)
fn (generic function with 1 method)

julia> @code_hlo fn(rM, rM)
module @reactant_fn attributes {mhlo.num_partitions = 1 : i64, mhlo.num_replicas = 1 : i64} {
  func.func @main(%arg0: tensor<4x3xf64>) -> tensor<4x3xf64> attributes {enzymexla.memory_effects = []} {
    %0 = stablehlo.sine %arg0 : tensor<4x3xf64>
    %1 = stablehlo.cosine %arg0 : tensor<4x3xf64>
    %2 = stablehlo.add %0, %1 : tensor<4x3xf64>
    return %2 : tensor<4x3xf64>
  }
}

julia> @code_hlo fn(rM_fs, rM_fs)
module @reactant_fn attributes {mhlo.num_partitions = 1 : i64, mhlo.num_replicas = 1 : i64} {
  func.func @main(%arg0: tensor<12xf64>) -> tensor<4x3xf64> attributes {enzymexla.memory_effects = []} {
    %0 = stablehlo.reshape %arg0 : (tensor<12xf64>) -> tensor<4x3xf64>
    %1 = stablehlo.sine %0 : tensor<4x3xf64>
    %2 = stablehlo.cosine %0 : tensor<4x3xf64>
    %3 = stablehlo.add %1, %2 : tensor<4x3xf64>
    return %3 : tensor<4x3xf64>
  }
}
```

Why is there an extra `stablehlo.reshape` in the FixedSizeArray version?
Presumably the memory itself is being stored 1-dimensionally.
It is, as is the case for `Array`.
```julia
    kwargs...,
) where {T,N}
    shape = size(prev)
    return reshape(
```
I guess this `reshape` is the culprit? How does `Array` work? Is it possible to construct this object directly with the right size instead of reshaping it?
But `Array` itself we override to keep the dimensionality, and we allocate everything ourselves for `Array`. If the actual tracing of a FixedSizeArray does the current "generic recursion into structs", it will eventually allocate a 1-dim `Memory`, always.
The whole point of `FixedSizeArray` is that the size is...well...fixed. Having to reshape it all the time seems to go in the opposite direction, especially when `Array` doesn't have that.
It feels to me like `make_tracer` should take the size as an (optional) argument. Looking at lines 1177 to 1196 in e4bb34f:
```julia
Base.@nospecializeinfer function make_tracer(
    seen,
    @nospecialize(prev::ConcreteIFRTArray{T,N}),
    @nospecialize(path),
    mode;
    @nospecialize(sharding = Sharding.NoSharding()),
    @nospecialize(device = nothing),
    @nospecialize(client = nothing),
    kwargs...,
) where {T,N}
    if mode == TracedToTypes
        throw("Cannot have ConcreteIFRTArray as function call argument.")
    end
    mode == ArrayToConcrete && return ConcreteIFRTArray(prev; sharding, device, client)
    mode != ConcreteToTraced && throw("Cannot trace concrete")
    haskey(seen, prev) && return seen[prev]::TracedRArray{T,N}
    res = TracedRArray{T,N}((path,), nothing, size(prev))
    seen[prev] = res
    return res
end
```
the size is currently always `size(prev)`, but it could be overridden if passed explicitly.
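A toy illustration of that suggestion (stand-in names, not Reactant's actual API): the traced shape defaults to `size(prev)` but can be overridden by the caller, which is what a FixedSizeArray wrapper would want when its backing buffer is 1-D:

```julia
# Stand-in for TracedRArray: records only element type and shape.
struct ToyTraced{T,N}
    shape::NTuple{N,Int}
end

# Stand-in for make_tracer with the proposed optional shape argument.
function toy_make_tracer(prev::AbstractArray{T}; shape=size(prev)) where {T}
    # Default: derive the shape from the wrapped value; a wrapper can
    # pass the outer N-D shape even if `prev` is a flat 1-D buffer.
    return ToyTraced{T,length(shape)}(Tuple(shape))
end
```

With this, tracing the flat backing buffer of a 3×4 array could still produce a 2-D traced value by passing `shape=(3, 4)` explicitly.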
I opened #1712 to implement my suggestion.
This function needs to be updated to return a `FixedSizeArray` where the internal data would be a Traced/Concrete Array (#1669 (comment)).
I guess I'm missing something fundamental: `Array` has the same memory backend, why should `FixedSizeArray` be any different?
Array maps to our Concrete/Traced Array. For all "wrapped" types, we need to preserve the wrapper type if we want the outputs to preserve the wrapped type. Else any operation you perform on a `FixedSizeArray` will inevitably be mapped to an `Array` output.
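The point about preserving wrappers can be shown without Reactant at all (toy types; `FixedWrap` and the two trace functions are stand-ins for illustration only):

```julia
# If tracing unwraps to the plain backing type, the output type no
# longer matches the input type; preserving the wrapper keeps them equal.
struct FixedWrap{T,N}
    data::Array{T,N}
end

unwrap_trace(x::FixedWrap) = x.data                     # output is a bare Array
preserve_trace(x::FixedWrap) = FixedWrap(copy(x.data))  # output stays wrapped
```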
I'm not really convinced a proposal for changing the memory backend in FixedSizeArrays is going to fly: it's meant to follow Array very closely, and the memory backend is always a flattened dense vector.
I mean once #1696 goes in, I think we should be able to make that work (though the backing memory might need to be a `reshape(tracedrarray{n}, 1)`; that reshape will optimize out with the one that's emitted currently).

```julia
julia> M_fs = FixedSizeMatrix(M)
3×4 FixedSizeArray{Float64, 2, Memory{Float64}}:
  0.94348    2.80883  0.565887  -0.626701
 -0.235831  -1.81099  0.232086  -0.460039
  0.816708   1.20653  1.16788   -2.09359

julia> rM_fs = Reactant.to_rarray(M_fs)
3×4 reshape(::ConcretePJRTArray{Float64,1}, 3, 4) with eltype Float64:
  0.94348    2.80883  0.565887  -0.626701
 -0.235831  -1.81099  0.232086  -0.460039
  0.816708   1.20653  1.16788   -2.09359
```

We shouldn't unwrap the array type here.
```julia
using FixedSizeArrays
using Reactant
using Reactant: TracedRArray, TracedRNumber, Ops
using ReactantCore: ReactantCore
```
Suggested change:

```diff
-using FixedSizeArrays
-using Reactant
-using Reactant: TracedRArray, TracedRNumber, Ops
-using ReactantCore: ReactantCore
+using FixedSizeArrays: FixedSizeArrayDefault
+using Reactant: Reactant, TracedRArray, TracedRNumber, Ops
+using ReactantCore: ReactantCore
```
```julia
using ReactantCore: ReactantCore

function Reactant.traced_type_inner(
    @nospecialize(_::Type{FixedSizeArrays.FixedSizeArrayDefault{T,N}}),
```
Suggested change:

```diff
-    @nospecialize(_::Type{FixedSizeArrays.FixedSizeArrayDefault{T,N}}),
+    @nospecialize(_::Type{FixedSizeArrayDefault{T,N}}),
```
```julia
Base.@nospecializeinfer function Reactant.make_tracer(
    seen,
    @nospecialize(prev::FixedSizeArrays.FixedSizeArrayDefault{T,N}),
```
Suggested change:

```diff
-    @nospecialize(prev::FixedSizeArrays.FixedSizeArrayDefault{T,N}),
+    @nospecialize(prev::FixedSizeArrayDefault{T,N}),
```
```julia
    kwargs...,
) where {T,N}
    shape = size(prev)
    return reshape(
```
Missing the call to `FixedSizeArray`.
Returning a FixedSizeArray causes problems.

With `track_numbers=Number` I get this (`@jit(f(rx))`):

```julia
ERROR: TypeError: in RNumber, in T, expected T<:Union{Complex{Int16}, Complex{Int32}, Complex{Int64}, Complex{Int8}, Complex{UInt16}, Complex{UInt32}, Complex{UInt64}, Complex{UInt8}, Core.BFloat16, Bool, Float16, Float32, Float64, Int16, Int32, Int64, Int8, UInt16, UInt32, UInt64, UInt8, Reactant.F8E4M3B11FNUZ, Reactant.F8E4M3FN, Reactant.F8E4M3FNUZ, Reactant.F8E5M2, Reactant.F8E5M2FNUZ, ComplexF64, ComplexF32}, got Type{Reactant.TracedRNumber{Float32}}
Stacktrace:
  [1] copyto!(dest::Memory{Reactant.TracedRNumber{Float32}}, src::Vector{Reactant.TracedRNumber{Float32}})
    @ Reactant.TracedRArrayOverrides ~/projects/Reactant.jl/src/TracedRArray.jl:492
  [2] collect_as_vectorlike_with_known_eltype_and_length(::Type{Memory{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:269
  [3] collect_as_memory_with_known_eltype_and_known_length(::Type{Reactant.TracedRNumber{Float32}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:345
  [4] collect_as_memory_with_known_eltype(::Type{Reactant.TracedRNumber{Float32}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:353
  [5] collect_as_memory(::Function, ::Type{Memory{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:373
  [6] collect_as_common_invariant(e::typeof(Collects.EmptyIteratorHandling.just_throws), ::Type{Memory{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:376
  [7] collect_as_common(e::typeof(Collects.EmptyIteratorHandling.just_throws), type::Type{Memory{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:400
  [8] (::Collects.Collect{typeof(Collects.EmptyIteratorHandling.just_throws)})(type::Type{Memory{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:411
  [9] (::Collects.Collect{typeof(Collects.EmptyIteratorHandling.just_throws)})(::Type{FixedSizeArrayDefault{Reactant.TracedRNumber{Float32}}}, iterator::Vector{Reactant.TracedRNumber{Float32}})
    @ FixedSizeArrays ~/.julia/packages/FixedSizeArrays/VHLXx/src/collect_as.jl:45
 [10] collect_as(::Type{FixedSizeArrayDefault{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}}; empty_iterator_handler::typeof(Collects.EmptyIteratorHandling.just_throws))
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:425
 [11] collect_as(::Type{FixedSizeArrayDefault{Reactant.TracedRNumber{Float32}}}, collection::Vector{Reactant.TracedRNumber{Float32}})
    @ Collects ~/.julia/packages/Collects/iMuH5/src/Collects.jl:423
 [12] collect_as_haseltype(::Type{FixedSizeArrayDefault}, iterator::Vector{Reactant.TracedRNumber{Float32}})
    @ FixedSizeArrays ~/.julia/packages/FixedSizeArrays/VHLXx/src/collect_as.jl:56
 [13] converting_constructor(::Type{FixedSizeArrayDefault}, src::Vector{Reactant.TracedRNumber{Float32}})
    @ FixedSizeArrays ~/.julia/packages/FixedSizeArrays/VHLXx/src/FixedSizeArray.jl:334
 [14] (FixedSizeArrayDefault)(src::Vector{Reactant.TracedRNumber{Float32}})
    @ FixedSizeArrays ~/.julia/packages/FixedSizeArrays/VHLXx/src/FixedSizeArray.jl:345
 [15] make_tracer(seen::Reactant.OrderedIdDict{Any, Any}, prev::FixedSizeArray{Float32, 1, Memory{Float32}}, path::Any, mode::Reactant.TraceMode; kwargs::@Kwargs{runtime::Val{:PJRT}})
    @ ReactantFixedSizeArraysExt ~/projects/Reactant.jl/ext/ReactantFixedSizeArraysExt.jl:33
```
I don't know why it's throwing this, as after inspection the inner `Memory` object is traced properly.

Without it I get:
```julia
ERROR: cannot copy Ptr{Nothing} @0x0000795f973dc1e0 of type Ptr{Nothing}
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] create_result(tocopy::Ptr{…}, path::Tuple{…}, result_stores::Dict{…}, path_to_shard_info::Nothing, to_unreshard_results::Dict{…}, unresharded_code::Vector{…}, unresharded_arrays_cache::Dict{…}, used_shardinfo::Set{…}, result_cache::IdDict{…}, var_idx::Base.RefValue{…}, resultgen_code::Vector{…})
   @ Reactant.Compiler ~/projects/Reactant.jl/src/Compiler.jl:231
 [3] create_result(tocopy::Memory{…}, path::Tuple{…}, result_stores::Dict{…}, path_to_shard_info::Nothing, to_unreshard_results::Dict{…}, unresharded_code::Vector{…}, unresharded_arrays_cache::Dict{…}, used_shardinfo::Set{…}, result_cache::IdDict{…}, var_idx::Base.RefValue{…}, resultgen_code::Vector{…})
   @ Reactant.Compiler ~/projects/Reactant.jl/src/Compiler.jl:256
 [4] create_result(tocopy::FixedSizeArray{…}, path::Tuple{}, result_stores::Dict{…}, path_to_shard_info::Nothing, to_unreshard_results::Dict{…}, unresharded_code::Vector{…}, unresharded_arrays_cache::Dict{…}, used_shardinfo::Set{…}, result_cache::IdDict{…}, var_idx::Base.RefValue{…}, resultgen_code::Vector{…})
   @ Reactant.Compiler ~/projects/Reactant.jl/src/Compiler.jl:256
 [5] codegen_unflatten!(linear_args::Vector{…}, preserved_args::Vector{…}, concretized_res_names::Vector{…}, linear_results::Vector{…}, concrete_result::FixedSizeArray{…}, result_stores::Dict{…}, path_to_shard_info::Nothing, linear_result_shard_info::Vector{…}, client::Reactant.XLA.PJRT.Client, resharded_inputs::Dict{…})
   @ Reactant.Compiler ~/projects/Reactant.jl/src/Compiler.jl:3147
 [6] compile(f::Function, args::Tuple{…}; kwargs::@Kwargs{…})
   @ Reactant.Compiler ~/projects/Reactant.jl/src/Compiler.jl:3599
 [7] top-level scope
   @ ~/projects/Reactant.jl/src/Compiler.jl:2614
```
For this one I don't have a clue.
So:

```julia
julia> mem = Memory{Float32}([1.0f0, 2.0f0]);

julia> getfield(mem, 2)
Ptr{Nothing} <pointer address>
```

This is where the pointer is coming from.
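For context, that `Ptr` is one of `Memory`'s fields, so any generic field-by-field recursion (as in `create_result`) will run into it (a small sketch, assuming Julia ≥ 1.11):

```julia
# Memory{T} stores essentially a length and a raw data pointer; shape
# metadata lives in wrappers (Array, FixedSizeArray), not in the buffer.
mem = Memory{Float32}([1.0f0, 2.0f0])

names = fieldnames(typeof(mem))   # the raw fields, including :ptr
ptr = getfield(mem, :ptr)         # the raw data pointer, a Ptr
```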
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Since Reactant doesn't support resizing for normal arrays, and FixedSizeArray isn't doing any fancy memory/optimization stuff beyond the fixed size (unlike OneHotArrays or FillArrays), shouldn't it be fine for the FixedSizeArray to transform into an ordinary Concrete Array?
#1649
Struggling with the Memory object. This is what I'm getting currently: